Instrumental variable regression via kernel maximum moment loss

Authors

Abstract

We investigate a simple objective for nonlinear instrumental variable (IV) regression based on a kernelized conditional moment restriction known as the maximum moment restriction (MMR). The MMR is formulated by maximizing the interaction between the residual and instruments belonging to a unit ball in a reproducing kernel Hilbert space (RKHS). First, it allows us to simplify IV regression to an empirical risk minimization problem, where the risk function depends on the choice of instruments and can be estimated by a U-statistic or V-statistic. Second, on the basis of this simplification, we are able to provide consistency and asymptotic normality results in both parametric and nonparametric settings. Finally, we provide easy-to-use algorithms with an efficient hyperparameter selection procedure. We demonstrate the effectiveness of our algorithms using experiments on both synthetic and real-world data.
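The U- and V-statistic estimators of the MMR loss mentioned in the abstract can be sketched as follows. This is a minimal NumPy illustration on a toy linear IV problem; the RBF kernel, its bandwidth, the data-generating process, and the grid search over a scalar coefficient are our own illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def rbf_kernel(Z, gamma=1.0):
    # Pairwise RBF kernel matrix: k(z_i, z_j) = exp(-gamma * ||z_i - z_j||^2)
    sq = np.sum(Z**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T
    return np.exp(-gamma * d2)

def mmr_v_statistic(residuals, K):
    # V-statistic estimate of the MMR loss: (1/n^2) * sum_{i,j} r_i r_j k(z_i, z_j)
    n = len(residuals)
    return residuals @ K @ residuals / n**2

def mmr_u_statistic(residuals, K):
    # U-statistic estimate: same double sum but excluding the diagonal (i = j) terms
    n = len(residuals)
    off_diag = residuals @ K @ residuals - residuals**2 @ np.diag(K)
    return off_diag / (n * (n - 1))

# Toy linear IV problem: z is the instrument, u a confounder, true effect = 2.0
rng = np.random.default_rng(0)
n = 200
z = rng.normal(size=(n, 1))
u = rng.normal(size=(n, 1))
x = z + u + 0.1 * rng.normal(size=(n, 1))
y = 2.0 * x + u

# Minimize the V-statistic loss over a grid of candidate coefficients
K = rbf_kernel(z)
thetas = np.linspace(0.0, 4.0, 401)
losses = [mmr_v_statistic((y - t * x).ravel(), K) for t in thetas]
theta_hat = thetas[int(np.argmin(losses))]
```

Because the residual is uncorrelated with any RKHS function of the instrument only at the true coefficient, the loss is minimized near 2.0 despite the confounder `u` biasing ordinary least squares.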


Similar articles

Instrumental Variable Quantile Regression

Quantile regression is an increasingly important tool that estimates the conditional quantiles of a response Y given a vector of regressors D. It usefully generalizes Laplace’s median regression and can be used to measure the effect of covariates not only in the center of a distribution, but also in the upper and lower tails. For the linear quantile model defined by Y = D′γ(U) where D′γ(U) is s...
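The conditional-quantile estimation this snippet describes rests on the check (pinball) loss: the constant minimizing the mean pinball loss at level tau is the empirical tau-quantile. A small NumPy sketch of that defining property; the grid search and the standard-normal example are our own illustration, not the paper's method.

```python
import numpy as np

def pinball_loss(residual, tau):
    # Check (pinball) loss: tau * r for r >= 0, (tau - 1) * r for r < 0
    return np.where(residual >= 0, tau * residual, (tau - 1) * residual)

# Minimizing the mean pinball loss over a constant recovers the tau-quantile
rng = np.random.default_rng(1)
y = rng.normal(size=10_000)
tau = 0.9
grid = np.linspace(-3.0, 3.0, 1201)
losses = [pinball_loss(y - q, tau).mean() for q in grid]
q_hat = grid[int(np.argmin(losses))]
# q_hat approximates the standard-normal 0.9-quantile (about 1.28)
```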


Doubly Robust Instrumental Variable Regression

Instrumental variable (IV) estimation typically requires the user to correctly specify the relationship between the regressors and the outcome to obtain a consistent estimate of the effects of the treatments. This paper proposes doubly robust IV regression estimators that only require the user to either correctly specify the relationship between the measured confounding variables (i.e., include...


Oracle Inequality for Instrumental Variable Regression

where φ is the parameter of interest which models the relationship while U is an error term. Contrary to usual statistical regression models, the error term is correlated with the explanatory variables X, hence E(U|X) ≠ 0, preventing direct estimation of φ. To overcome the endogeneity of X, we assume that there exists an observed random variable W, called the instrument, which decorrelates t...
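The endogeneity setup described here, E(U|X) ≠ 0 with an instrument W, is classically handled in the linear case by two-stage least squares (2SLS). A minimal simulated sketch with our own toy data-generating process, not taken from the paper:

```python
import numpy as np

# Linear IV simulation: w is a valid instrument (correlated with x,
# uncorrelated with the error u), and the true coefficient is 1.5.
rng = np.random.default_rng(2)
n = 5000
w = rng.normal(size=n)
u = rng.normal(size=n)
x = 0.8 * w + u + rng.normal(size=n)   # x is endogenous: it contains u
y = 1.5 * x + u                        # hence E(U|X) != 0

# Naive OLS slope is biased upward because Cov(x, u) > 0
ols = (x @ y) / (x @ x)

# 2SLS. Stage 1: project x onto the instrument w.
x_hat = w * ((w @ x) / (w @ w))
# Stage 2: regress y on the fitted values; this recovers the causal slope.
iv = (x_hat @ y) / (x_hat @ x_hat)
```

Because `x_hat` retains only the variation in `x` explained by the instrument, the second-stage slope is consistent for 1.5, while the OLS slope is not.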


Bayesian Approximate Kernel Regression with Variable Selection

Nonlinear kernel regression models are often used in statistics and machine learning due to greater accuracy than linear models. Variable selection for kernel regression models is a challenge partly because, unlike the linear regression setting, there is no clear concept of an effect size for regression coefficients. In this paper, we propose a novel framework that provides an analog of the eff...


Kernel Ridge Regression via Partitioning

In this paper, we investigate a divide and conquer approach to Kernel Ridge Regression (KRR). Given n samples, the division step involves separating the points based on some underlying disjoint partition of the input space (possibly via clustering), and then computing a KRR estimate for each partition. The conquering step is simple: for each partition, we only consider its own local estimate fo...
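The divide-and-conquer recipe this snippet describes, partition the input space, fit one KRR estimate per partition, and answer each query only with its local estimate, can be sketched as below. The fixed interval edges, RBF kernel, and all hyperparameters are illustrative assumptions rather than the paper's exact method.

```python
import numpy as np

def rbf(A, B, gamma=10.0):
    # Cross RBF kernel matrix between row sets A and B
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * d2)

def krr_fit(X, y, lam=1e-3, gamma=10.0):
    # Solve (K + lam * n * I) alpha = y for one partition's local estimate
    n = len(X)
    K = rbf(X, X, gamma)
    alpha = np.linalg.solve(K + lam * n * np.eye(n), y)
    return X, alpha

def krr_predict(model, Xq, gamma=10.0):
    Xtr, alpha = model
    return rbf(Xq, Xtr, gamma) @ alpha

# Divide: disjoint intervals of the input space; conquer: each query point
# is answered only by the estimate trained on its own partition.
rng = np.random.default_rng(3)
X = rng.uniform(0.0, 1.0, size=(600, 1))
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.normal(size=600)

edges = np.array([0.0, 0.5, 1.0])            # two partitions of [0, 1]
models = []
for lo, hi in zip(edges[:-1], edges[1:]):
    mask = (X[:, 0] >= lo) & (X[:, 0] < hi)
    models.append(krr_fit(X[mask], y[mask]))

Xq = np.array([[0.25], [0.75]])
preds = np.array([
    krr_predict(models[0], Xq[:1])[0],       # 0.25 lies in partition 0
    krr_predict(models[1], Xq[1:])[0],       # 0.75 lies in partition 1
])
```

Each local solve costs O(m^3) for a partition of size m instead of O(n^3) for the full sample, which is the computational point of the partitioning.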



Journal

Journal title: Journal of causal inference

Year: 2023

ISSN: 2193-3677, 2193-3685

DOI: https://doi.org/10.1515/jci-2022-0073